Tutorial – King, sound localisation

Greg Detre

6th December 2002

Andy King

 

one ear – move head, move pinna

2 ways of detecting sounds – pressure waves (non-directional)

air movement (insects, directional, low-frequency)

longitudinal – American cockroaches – at the back end, cerci (protuberances) covered with slender hairs, different subsets activated by different directions

 

ventral cochlear nucleus – both types preserve time information

 

man can detect 3° azimuth = accuracy of 20 µs ITD

an owl's head is 1/10 the size, just as accurate = 2 µs disparities

even smaller flies can do it even better
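
To see roughly where numbers like these come from, here is a quick check using the simple path-difference rule ITD ≈ d·sin(azimuth)/c; the head widths are nominal assumptions, not figures from the tutorial:

```python
import numpy as np

# Approximate ITD for a source at a given azimuth, using the simple
# straight-line path difference d*sin(theta) between the two ears.
# Head widths below are nominal assumptions, not measured values.
def itd_us(azimuth_deg, interaural_distance_m, c=343.0):
    """Interaural time difference in microseconds."""
    path_difference = interaural_distance_m * np.sin(np.radians(azimuth_deg))
    return 1e6 * path_difference / c

human_head = 0.18   # ~18 cm between the ears (assumed)
owl_head = 0.018    # ~1/10 of that, as in the notes

# A ~3 degree change in azimuth corresponds to an ITD change of a few
# tens of microseconds for a human head, and ~1/10 of that for the owl.
print(f"human, 3 deg: {itd_us(3, human_head):.0f} us")  # ~27 us
print(f"owl,   3 deg: {itd_us(3, owl_head):.1f} us")    # ~2.7 us
```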

the problem is not in the delay lines (in homeothermic animals)

but in the coincidence detectors

the time-course of an action potential is c. 1 ms – so the summation happens too slowly

2 dendritic trees – otherwise, don't seem special

 

stellate/bushy cells have funny synapses

some synapses release both excitatory and inhibitory transmitters

funny EPSP

2 of these matched up in time

lateral inhibition of an array of coincidence detectors

we have anatomical evidence, and we have ideas of how it could be done
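
One way to picture the delay-line/coincidence-detector arrangement (essentially the Jeffress model) is as a bank of detectors, each tuned to a different internal delay, with the best-matching delay cancelling the external ITD. A toy sketch, with an invented 500 Hz tone and a 200 µs ITD:

```python
import numpy as np

def jeffress_map(left, right, fs, max_delay_us=500.0, n_detectors=41):
    """Toy Jeffress-style array: each 'coincidence detector' is tuned to one
    internal delay and responds to the product of the delayed inputs."""
    delays_us = np.linspace(-max_delay_us, max_delay_us, n_detectors)
    responses = []
    for d in delays_us:
        shift = int(round(d * 1e-6 * fs))
        # Delay one ear's signal relative to the other and sum the coincidences.
        responses.append(np.sum(left * np.roll(right, shift)))
    return delays_us, np.array(responses)

# A 500 Hz tone arriving with a 200 us ITD (right ear leads).
fs = 100_000
t = np.arange(0, 0.05, 1 / fs)
itd = 200e-6
left = np.sin(2 * np.pi * 500 * (t - itd))
right = np.sin(2 * np.pi * 500 * t)

delays, resp = jeffress_map(left, right, fs)
print("best-matching internal delay:", delays[np.argmax(resp)], "us")  # ~200 us
```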

 

inferior colliculus space-specific neurons

visual system map in tectum (mid-brain)

also the snakes' infrared system

the visual map seems to be used to train the auditory map, recalibrating with growth + feather ruff etc.

 

azimuth ambiguity – front/back

owl sacrifices – intensity signals elevation, timing signals azimuth

but how do we manage elevation?

different reflections from the pinna, front/back and up/down asymmetrical

some enters directly, some reflected

CNS picks up on this, we don't know how

we're almost as good at elevation as azimuth

false ears – relearn; performance is reduced at first, then improves

if we learn with false ears, then take them off, we flip straight back to the original system

this plasticity may be related to wax in ears, length of hair – still, it's curious

why/how keep the old system? different from prism glasses, where you relearn and then have to relearn back again

with a microphone feed you lose elevation information, unless you put ear shapes round the microphone
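
To make the pinna-cue idea concrete, here is a toy sketch of one hypothesis for how the CNS could use it: compare the incoming spectrum against learned, elevation-dependent templates and pick the best match. The notch shapes and elevation range are invented, not measured HRTFs:

```python
import numpy as np

rng = np.random.default_rng(0)
freqs = np.linspace(2_000, 16_000, 200)          # Hz, analysis band (assumed)
elevations = np.arange(-30, 61, 10)              # degrees (assumed range)

def pinna_filter(elevation_deg):
    """Toy HRTF-like gain: a spectral notch whose centre frequency
    moves up with elevation (a common qualitative observation)."""
    notch_centre = 6_000 + 80 * elevation_deg
    return 1.0 - 0.8 * np.exp(-((freqs - notch_centre) / 800.0) ** 2)

# 'Learned' templates, one per elevation.
templates = {e: pinna_filter(e) for e in elevations}

def estimate_elevation(received_spectrum):
    """Pick the template whose shape best matches the received spectrum."""
    scores = {e: -np.sum((received_spectrum - t) ** 2)
              for e, t in templates.items()}
    return max(scores, key=scores.get)

# A broadband sound heard from +20 degrees, with a little noise.
heard = pinna_filter(20) + 0.05 * rng.standard_normal(freqs.size)
print("estimated elevation:", estimate_elevation(heard), "deg")  # ~20
```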

 

owl medulla = equivalent to cochlear nucleus

cochlear nucleus – divided into dorsal + ventral

 

2 dendrites – helps with coincidence detection – how? reference in reading list

 

in the wild, intensity is informative about distance, especially if you recognise the sound and know how loud it should be

high frequency bits of the sound are differentially absorbed in the wild – if you recognise it, you can use that spectral information for distance
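
A toy illustration of that spectral distance cue, with an invented absorption coefficient: if you know what the sound should sound like, the high-frequency roll-off of what you actually receive gives a rough distance estimate even when the overall level is unknown.

```python
import numpy as np

# Toy atmospheric absorption: alpha (dB per metre) rises with frequency.
# The coefficient below is invented for illustration, not a measured value.
def alpha_db_per_m(freq_hz):
    return 1e-4 * (freq_hz / 1000.0) ** 2

freqs = np.array([500.0, 1_000.0, 4_000.0, 8_000.0])     # Hz
known_source_db = np.array([70.0, 70.0, 70.0, 70.0])     # the remembered sound

def received_db(distance_m):
    # Ignore spherical spreading here; only the frequency-dependent part
    # is useful once the overall level is unknown.
    return known_source_db - alpha_db_per_m(freqs) * distance_m

def estimate_distance(heard_db, candidates=np.arange(1, 501)):
    # Compare spectral *shape* (mean removed), so overall loudness drops out.
    shape = heard_db - heard_db.mean()
    errs = [np.sum((shape - (received_db(d) - received_db(d).mean())) ** 2)
            for d in candidates]
    return candidates[int(np.argmin(errs))]

heard = received_db(200.0) - 12.0   # heard 12 dB quieter overall, 200 m away
print("estimated distance:", estimate_distance(heard), "m")  # ~200
```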

 

moustached bats – FM squeaks with continuous tone

the FM component indicates where in the tone the echo it's hearing comes from

cochlear fovea – complex harmonic, listen for the echo of a particular one

cetacea also echolocate – also the swifts and the South American oilbird – both live in caves

mention bats for distance

Doppler shifts – first derivative of distance (echo delay gives distance itself)
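
The arithmetic behind "distance and its first derivative": echo delay gives range, the Doppler shift of the echo gives closing velocity. The ~61 kHz call frequency and the speeds are nominal values for illustration:

```python
# Echo delay gives target range; the Doppler shift of the echo gives the
# rate of change of that range (closing velocity). Numbers are nominal.
c = 343.0            # speed of sound in air, m/s

def target_range(echo_delay_s):
    # Sound travels out and back, so divide the round trip by two.
    return c * echo_delay_s / 2.0

def closing_velocity(f_emitted_hz, f_echo_hz):
    # For speeds much smaller than c, the echo is shifted by roughly
    # 2 * v / c times the emitted frequency (factor 2: out and back).
    return c * (f_echo_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

print(f"range: {target_range(0.01):.2f} m")                             # ~1.7 m
print(f"closing velocity: {closing_velocity(61_000, 61_700):.2f} m/s")  # ~2 m/s
```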

 

what/where in somatosensory system

S1 – where, good topographic maps

what – further back in PPC – definitely involved in active touch

 

efferents only to the outer hair cells(?) – can change their size, tune up the cochlea

 

where does the auditory system do sound localisation?

mammals – brain stem

in bats, multiple cortical processing for elevation + azimuth etc.

also look at owls a bit, but they don't have a cerebral cortex (dolphins do???)

fMRI on dolphins? needs a big brain

dolphin = aquatic cows

 

fly (reading list) – negligible ITD

though there will be IID

time taken for auditory processing = very amplitude-dependent

interaction between amplitude and time delay

which accentuates the ITD
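
A toy version of that amplitude-latency interaction, with an invented latency function, just to show how a small interaural level difference can be converted into an extra effective time difference on top of the tiny acoustic ITD:

```python
def response_latency_ms(level_db):
    """Toy latency function: louder input -> earlier response.
    The 5 ms baseline and 0.05 ms/dB slope are invented for illustration."""
    return 5.0 - 0.05 * level_db

# Tiny acoustic ITD (microseconds) plus a small interaural level difference.
acoustic_itd_us = 1.5
left_db, right_db = 52.0, 50.0      # left ear slightly louder (nearer source)

# The quieter ear responds later, adding its latency difference to the ITD.
latency_difference_us = 1e3 * (response_latency_ms(right_db)
                               - response_latency_ms(left_db))
effective_itd_us = acoustic_itd_us + latency_difference_us
print(f"{effective_itd_us:.1f} us")   # 1.5 us acoustic -> ~100 us effective
```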

 

missing page???

there's always ambiguity in sound localisation – cone of confusion

need to combine information from different techniques

in summary, the cochlear nucleus is most of all a timing device, preliminary processing for the levels above

phase-locking is produced by the mechanics of polarisation of the hair cells

phase-locking only works < 4 kHz (vowels are in this range)

phase-locking can't be discriminated better with intensity – intensity affects the rate of firing, but phase-locking is not a rate code

may be more useful than tonotopicity for frequency discrimination

critical band � measure of the frequency resolution of the human auditory system

how far apart in frequency two signals have to be to be discriminable

many of these filters in the auditory system – they look like tuning curves
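
For a rough quantitative handle on how wide those filters are, the Glasberg & Moore ERB approximation (an outside reference, not something given in the tutorial) can be used:

```python
def erb_hz(centre_freq_hz):
    """Equivalent rectangular bandwidth of the auditory filter at a given
    centre frequency (Glasberg & Moore approximation)."""
    return 24.7 * (4.37 * centre_freq_hz / 1000.0 + 1.0)

# Filters are narrow in absolute terms at low frequencies and widen
# roughly proportionally at high frequencies.
for f in (250, 1_000, 4_000):
    print(f"{f} Hz: ERB ~ {erb_hz(f):.0f} Hz")
# 250 Hz: ~52 Hz, 1000 Hz: ~133 Hz, 4000 Hz: ~456 Hz
```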

hear the fundamental frequency even when it's removed from its harmonics – wouldn't work with place coding

early single channel cochlear implants only signalled in one place, yet some people were able to get quite good frequency discrimination from this, using phase-locking
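
The missing-fundamental observation falls naturally out of a temporal analysis, which is the kind of information phase-locking could carry. A sketch with a synthetic harmonic complex (not a model of the actual neural computation):

```python
import numpy as np

fs = 20_000
t = np.arange(0, 0.1, 1 / fs)

# Harmonics of 200 Hz with the 200 Hz fundamental itself left out.
signal = sum(np.sin(2 * np.pi * h * 200 * t) for h in (2, 3, 4, 5))

# Autocorrelation: the first strong peak away from zero lag sits at the
# period of the missing fundamental (5 ms), even though no energy is there.
ac = np.correlate(signal, signal, mode="full")[signal.size - 1:]
lag = np.argmax(ac[20:]) + 20          # skip the trivial zero-lag peak
print(f"period ~ {1e3 * lag / fs:.2f} ms")   # ~5 ms -> 200 Hz pitch
```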

isofrequency band – orthogonal to tonotopic axis

hearing is a 1D representation – the 2nd (isofrequency) dimension represents another parameter, e.g. intensity or time

some neurons respond to how quickly you change the frequency/amplitude

maps aren't generally useful (except for us scientists)

auditory cortex – object recognition, complex echolocation, short-term memory; with a unilateral temporal lesion, the only thing that definitely goes is sound localisation

Rauschecker, Wang, DSP used to see what it is about the stimuli that a cell is responding to
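
The kind of DSP analysis being referred to is often some form of reverse correlation; below is a minimal spike-triggered-average sketch on synthetic data, with an invented cell, not the actual method of those studies:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 'spectrogram' stimulus: n_freq channels x n_time bins.
n_freq, n_time, window = 20, 5_000, 10
stimulus = rng.standard_normal((n_freq, n_time))

# Invented cell: fires when channel 7 was strong ~3 bins ago.
drive = np.roll(stimulus[7], 3)
spikes = (drive > 1.5).astype(float)
spikes[:window] = 0.0          # ignore the edge

# Spike-triggered average: average the stimulus window preceding each spike.
spike_times = np.nonzero(spikes)[0]
sta = np.mean([stimulus[:, t - window:t] for t in spike_times], axis=0)

# The STA should light up at channel 7, a few bins before the spike.
chan, lag = np.unravel_index(np.argmax(sta), sta.shape)
print("preferred channel:", chan, "| bins before spike:", window - lag)
```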

King – Nature Nov, letters, 'linear processing …'

haven't fully decided whether distance is important there

found neurons in areas around A1 sensitive to pairs of sounds (echoes) in mustache bats, distance area, velocity area

the bat uses a very limited range of pure sounds

more difficult for a wide range of communication sounds

fewer auditory studies on awake monkeys

more difficult to use earphones, but anesthesia is assumed to give responses that are weaker but not substantially different from awake

there's no directed behaviour in anesthetised animals

 

Questions

why does being poikilothermic screw up the delay lines???

what about front/back???

phase-locking???

fly sound-localiser

head-centred???

why does the auditory system but not the visual system use space-specific cells???

how do we distinguish pinnae spectral transformation from inherent characteristics of the sound???

distance/level

MLD??

easier to have 3 ears???

the fish Anableps has 4 eyes (S American), swims at the surface

do we echolocate???

Fourier???

lots of pre-cortical auditory processing � any cortical sound localisation???

where in the auditory system does sound localisation happen???

can blind people echolocate at all??? he doesn't know

 

tonic release??? vs phasic???

can't have a 3D frequency/amplitude/time representation – auditory cortex of mustache bat (known best)

neuroethological approaches – A1 is tonotopic; in the isofrequency dimension, non-monotonic intensity tuning – what would the label be for time???

map??? just any systematic map???